Least Angle and L1 Regression: A Review
Authors
Abstract
Least Angle Regression is a promising technique for variable selection applications, offering a nice alternative to stepwise regression. It provides an explanation for the similar behavior of LASSO (L1-penalized regression) and forward stagewise regression, and provides a fast implementation of both. The idea has caught on rapidly, and sparked a great deal of research interest. In this paper, we give an overview of Least Angle Regression and the current state of related research.

AMS 2000 subject classifications: Primary 62J07; secondary 62J99.
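For readers who want the criterion spelled out, the L1-penalized least squares (LASSO) problem referred to throughout this page can be written as below; the notation (response y, predictor matrix X, coefficients β, tuning parameters λ and t) is ours, introduced only for illustration.

    \hat{\beta}(\lambda) = \arg\min_{\beta}\ \tfrac{1}{2}\,\lVert y - X\beta \rVert_2^2 + \lambda\,\lVert \beta \rVert_1 ,

or, in the equivalent constrained form,

    \min_{\beta}\ \lVert y - X\beta \rVert_2^2 \quad \text{subject to} \quad \lVert \beta \rVert_1 \le t .

Least Angle Regression, with the small modification described by Efron, Hastie, Johnstone and Tibshirani (2004), traces the entire solution path as λ varies at roughly the cost of a single ordinary least squares fit, which is what makes it a fast implementation of both the LASSO and forward stagewise fits.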
Similar articles
Least angle and ℓ1 penalized regression: A review
Least Angle Regression is a promising technique for variable selection applications, offering a nice alternative to stepwise regression. It provides an explanation for the similar behavior of LASSO (l1-penalized regression) and forward stagewise regression, and provides a fast implementation of both. The idea has caught on rapidly, and sparked a great deal of research interest. In this paper, w...
Efficient L1 Regularized Logistic Regression
L1 regularized logistic regression is now a workhorse of machine learning: it is widely used for many classification problems, particularly ones with many features. L1 regularized logistic regression requires solving a convex optimization problem. However, standard algorithms for solving convex optimization problems do not scale well enough to handle the large datasets encountered in many pract...
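As a concrete illustration of the model class described above (not of the specialized large-scale solvers discussed in that work), here is a minimal sketch using scikit-learn; the simulated data, the choice of solver, and the value of C are our assumptions.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical data: 200 samples, 50 features, only the first three informative.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 50))
    beta = np.zeros(50)
    beta[:3] = [2.0, -1.5, 1.0]
    y = (X @ beta + rng.normal(size=200) > 0).astype(int)

    # penalty="l1" requests the L1-regularized logistic likelihood;
    # C is the inverse regularization strength, so smaller C gives a sparser fit.
    clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.3).fit(X, y)
    print("non-zero coefficients:", int(np.count_nonzero(clf.coef_)))

Smaller values of C drive more coefficients exactly to zero, which is the variable-selection behavior that makes the L1 penalty attractive for problems with many features.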
Forward stagewise regression and the monotone lasso
Abstract: We consider the least angle regression and forward stagewise algorithms for solving penalized least squares regression problems. In Efron, Hastie, Johnstone & Tibshirani (2004) it is proved that the least angle regression algorithm, with a small modification, solves the lasso regression problem. Here we give an analogous result for incremental forward stagewise regression, showing tha...
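The incremental forward stagewise idea itself is simple enough to sketch in a few lines; the step size eps, the fixed number of steps, and the assumption of standardized predictors are ours, and this sketch omits the monotone-lasso machinery that paper develops.

    import numpy as np

    def forward_stagewise(X, y, eps=0.01, n_steps=5000):
        """Incremental forward stagewise regression: at each step, nudge the
        coefficient of the predictor most correlated with the current residual
        by eps in the direction of that correlation."""
        n, p = X.shape
        beta = np.zeros(p)
        residual = y - y.mean()          # assumes the columns of X are standardized
        for _ in range(n_steps):
            corr = X.T @ residual        # correlations with the current residual
            j = np.argmax(np.abs(corr))  # most correlated predictor
            step = eps * np.sign(corr[j])
            beta[j] += step
            residual -= step * X[:, j]
        return beta

As eps shrinks toward zero, the coefficient profiles traced by this procedure approach the infinitesimal forward stagewise path, which is the object that paper relates to the monotone lasso.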
Least Squares Optimization with L1-Norm Regularization
This project surveys and examines optimization approaches proposed for parameter estimation in Least Squares linear regression models with an L1 penalty on the regression coefficients. We first review linear regression and regularization, and both motivate and formalize this problem. We then give a detailed analysis of 8 of the varied approaches that have been proposed for optimizing this objec...
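To connect this formalization back to Least Angle Regression, the full regularization path of the L1-penalized least squares problem can be obtained from a standard LARS implementation; the example below uses scikit-learn's lars_path purely as an illustration, not as one of the eight approaches analyzed in that project report, and the simulated data are our assumption.

    import numpy as np
    from sklearn.linear_model import lars_path

    # Hypothetical data with three truly non-zero coefficients.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(100, 20))
    coef = np.zeros(20)
    coef[[0, 3, 7]] = [3.0, -2.0, 1.5]
    y = X @ coef + rng.normal(size=100)

    # method="lasso" applies the lasso modification of LARS, so the returned
    # coefficient path solves the L1-penalized least squares problem at every
    # breakpoint of the penalty parameter.
    alphas, active, coefs = lars_path(X, y, method="lasso")
    print("variables in the final active set:", list(active))

The coefs array holds one column of coefficients per breakpoint of the penalty, so plotting its rows against alphas reproduces the familiar piecewise-linear lasso coefficient path.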
Points with large α-depth
We show that for any ε > 0 there exists an angle α = α(ε) between 0 and π, depending only on ε, with the following two properties: (1) For any continuous probability measure in the plane one can find two lines l1 and l2, crossing at an angle of (at least) α, such that the measure of each of the two opposite quadrants of angle π − α, determined by l1 and l2, is at least 1/2 − ε. (2) For any set ...
Publication date: 2008